1,353 research outputs found

    Soft Scalable Self-Reconfigurable Modular Cellbot

    Hazardous environments such as disaster-affected areas, outer space, and radiation-affected areas are dangerous for humans. Autonomous systems that can navigate these environments would reduce the risk to human life. The terrains in these applications are diverse and unknown, so a robot is needed that can adapt its own morphology and apply suitable control to move in the desired manner. Although monolithic robots exist for some of these applications, such as the Curiosity rover for Mars exploration, a modular robot composed of multiple simple units could increase fault tolerance. A modular design also allows the robot to be scaled up or down for the current task, for example scaling up by connecting multiple units to cover a wider area, or scaling down to pass through a tight space.

    Taking bio-inspiration from cells, which, depending on environmental conditions, come together to form different structures and carry out different tasks, a soft modular robot called the Cellbot was developed, composed of multiple units called ‘cells’. Tests were conducted to understand the cellbot's movement over surfaces with different friction, for different actuation functions, numbers of cells connected in a line (1D), and shapes formed by connecting cells in 2D. A simulation model was developed to test a wide range of friction coefficients and actuation functions. Based on the results, cells could be designed from a material whose frictional properties lie in the optimal locomotion range; where an application involves diverse terrains, the number of connected units can instead be changed to optimise locomotion. Initial tests used a ‘ball robot’, in which the cellbot was built from balls that touch the ground to exploit friction and actuators that provide the force to move the robot. The model was then extended to a ‘bellow robot’, fabricated from hyper-elastic bellows and driven by pneumatic actuation. The inflation of a cell and of its neighbouring cells determined whether the cell touched the ground or was lifted, so a cell could either rest on the ground to provide anchoring friction or be lifted and pushed or pulled, thereby moving the robot. The cells were connected by magnets that can be disconnected and reconnected by morphing the robot body, so the cellbot can reconfigure by changing the number of connected units or its shape, and damaged cells can easily be detached and replaced. Complex cellbot movements can be achieved either by switching between robot morphologies or by changing the actuation control.

    Future cellbots will be controlled remotely to change their morphology, control, and number of connected cells, making them suitable for missions that require fault tolerance and autonomous shape adaptation. The proposed cellbot platform has the potential to reduce energy, time, and cost compared with traditional robots, with applications such as outer-space exploration, search and rescue in disaster-affected areas, internal medical procedures, and nuclear decommissioning.
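
    The anchoring behaviour described above can be illustrated with a very small quasi-static model: one cell is anchored (high friction, acting as a foot) while the other is lifted, and the actuator between them alternately extends and contracts. The Python sketch below is an illustrative reconstruction, not the authors' model; the two-cell inchworm cycle, stroke length, and rest spacing are assumptions chosen only to show how alternating anchoring turns a symmetric actuation into net forward motion.

        # Minimal quasi-static sketch of a two-cell anchoring ("inchworm") gait.
        # Illustrative only: the stroke, spacing, and phase ordering are assumptions.

        stroke = 0.05       # assumed actuator extension per half-cycle, in metres
        rest_length = 0.10  # assumed spacing between the two cells at rest, in metres
        n_cycles = 5

        rear, front = 0.0, rest_length

        for cycle in range(1, n_cycles + 1):
            # Phase 1: rear cell anchored (touching the ground), front cell lifted;
            # the actuator extends, pushing the front cell forward.
            front = rear + rest_length + stroke

            # Phase 2: front cell anchored, rear cell lifted;
            # the actuator contracts, pulling the rear cell forward.
            rear = front - rest_length

            print(f"cycle {cycle}: rear = {rear:.2f} m, front = {front:.2f} m")

        # Each full cycle advances the robot by exactly one actuator stroke.

    Each cycle nets one actuator stroke of forward displacement; with more cells the same anchor-and-shift pattern can be staggered along the line, which is why the friction of the anchored cells and the chosen actuation function together determine how well the robot moves.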

    XOR Binary Gravitational Search Algorithm with Repository: Industry 4.0 Applications

    Industry 4.0 is the fourth generation of industry, which is expected to revolutionize manufacturing methods through the integration of machine learning and artificial intelligence approaches on the factory floor to obtain robustness and to speed up process changes. In particular, the use of a digital twin in a manufacturing environment makes it possible to test such approaches in a timely manner, using a realistic 3D environment that limits safety issues and the danger of damage to resources. To obtain superior performance in an Industry 4.0 setup, a modified version of the binary gravitational search algorithm is introduced, which benefits from an exclusive-or (XOR) operator and a repository to improve the exploration property of the algorithm. A mathematical analysis of the proposed optimization approach yields two theorems showing that the proposed modification to the velocity vector can direct particles toward the best particles, and the repository provides a guideline that directs the particles to the best solutions more rapidly. The proposed algorithm is evaluated on benchmark optimization problems covering a diverse range of functions, including unimodal and multimodal functions as well as functions with multiple local minima. It is compared against several existing binary optimization algorithms, including existing versions of the binary gravitational search algorithm, improved binary optimization, binary particle swarm optimization, binary grey wolf optimization, and binary dragonfly optimization. To show that the proposed approach is an effective method for real-world binary optimization problems arising in an Industry 4.0 environment, it is applied to optimize the assembly task of an industrial robot assembling an industrial calculator, and the optimal movements obtained are implemented on a real robot. Furthermore, the digital twin of a universal robot is developed, and its path planning in the presence of obstacles is carried out using the proposed optimization algorithm; the resulting path is inspected and validated by a human expert. It is shown that the proposed approach can effectively solve optimization problems that arise in an Industry 4.0 environment.
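
    As a rough illustration of the kind of update described here, the sketch below implements a generic binary gravitational search loop in which a candidate's bits are flipped according to an XOR comparison with a leader drawn from a small repository of elite solutions. This is a sketch under stated assumptions, not the published algorithm: the tanh transfer function, the repository size of five, the exponential decay of the gravitational constant, and the benchmark objective are all choices made here for illustration.

        import numpy as np

        def binary_gsa_xor(fitness, dim, n_agents=20, n_iter=100, g0=100.0, seed=0):
            """Sketch of a binary gravitational search algorithm with an XOR-guided
            position update and a repository of elite solutions (minimisation)."""
            rng = np.random.default_rng(seed)
            X = rng.integers(0, 2, size=(n_agents, dim))   # binary positions
            V = np.zeros((n_agents, dim))                  # velocities
            repository = []                                # elite solutions seen so far

            for t in range(n_iter):
                f = np.array([fitness(x) for x in X])
                best, worst = f.min(), f.max()
                m = (f - worst) / (best - worst + 1e-12)   # better fitness -> larger mass
                M = m / (m.sum() + 1e-12)
                G = g0 * np.exp(-20.0 * t / n_iter)        # decaying gravitational constant

                repository.append(X[f.argmin()].copy())
                repository = sorted(repository, key=fitness)[:5]

                # accelerations from pairwise attractions, with Hamming distance as R
                A = np.zeros((n_agents, dim))
                for i in range(n_agents):
                    for j in range(n_agents):
                        if i == j:
                            continue
                        R = np.abs(X[i] - X[j]).sum() + 1e-12
                        A[i] += rng.random() * G * M[j] * (X[j] - X[i]) / R

                V = rng.random((n_agents, dim)) * V + A

                # XOR-guided move: only bits that differ from a repository leader may
                # flip, with a probability given by a tanh transfer of the velocity.
                leader = repository[rng.integers(len(repository))]
                prob = np.abs(np.tanh(V))
                flip = (X ^ leader) & (rng.random((n_agents, dim)) < prob)
                X = X ^ flip

            return min(repository, key=fitness)

        # usage on a trivial benchmark: minimise the number of set bits
        best = binary_gsa_xor(lambda x: int(x.sum()), dim=30)
        print(best)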

    Risk Portfolio Optimization Using the Markowitz MVO Model in Relation to Human Limitations in Predicting the Future from the Perspective of the Al-Qur'an

    Risk portfolio management in modern finance has become increasingly technical, requiring sophisticated mathematical tools in both research and practice. Since companies cannot insure themselves completely against risk, reflecting the human inability to predict the future precisely as written in Al-Qur'an surah Luqman verse 34, they have to manage risk to obtain an optimal portfolio. The objective is to minimize the variance among all portfolios that achieve at least a certain expected return (or, equivalently, to maximize the expected return for a given level of risk). This study therefore focuses on optimizing the risk portfolio using the Markowitz MVO (Mean-Variance Optimization) model. The theoretical framework for the analysis comprises the arithmetic mean, geometric mean, variance, covariance, linear programming, and quadratic programming. Finding a minimum-variance portfolio leads to a convex quadratic program: minimizing the objective function xᵀΣx (the portfolio variance) subject to the constraints μᵀx ≥ r and Ax = b. The outcome of this research is the optimal risk portfolio over a set of investments, obtained with MATLAB R2007b together with a graphical analysis.
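
    Although the study used MATLAB R2007b, the same quadratic program can be sketched in a few lines of Python. The expected returns, covariance matrix, and required return below are made-up illustrative numbers, not data from the study, and the no-short-selling bounds are an extra assumption of this sketch.

        import numpy as np
        from scipy.optimize import minimize

        # Illustrative inputs (not from the study): expected returns and covariance
        mu = np.array([0.08, 0.12, 0.10, 0.07])
        Sigma = np.array([[0.10, 0.02, 0.04, 0.00],
                          [0.02, 0.12, 0.01, 0.03],
                          [0.04, 0.01, 0.09, 0.02],
                          [0.00, 0.03, 0.02, 0.07]])
        r_min = 0.09                                             # required expected return

        objective = lambda x: x @ Sigma @ x                      # portfolio variance x'Σx
        constraints = [
            {"type": "eq",   "fun": lambda x: x.sum() - 1.0},    # budget constraint (Ax = b)
            {"type": "ineq", "fun": lambda x: mu @ x - r_min},   # μ'x ≥ r
        ]
        bounds = [(0.0, 1.0)] * len(mu)                          # assumption: no short selling

        x0 = np.full(len(mu), 1.0 / len(mu))
        res = minimize(objective, x0, method="SLSQP", bounds=bounds, constraints=constraints)
        print("weights:", np.round(res.x, 3), "variance:", round(float(res.fun), 4))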

    Search for heavy resonances decaying to two Higgs bosons in final states containing four b quarks

    A search is presented for narrow heavy resonances X decaying into pairs of Higgs bosons (H) in proton-proton collisions collected by the CMS experiment at the LHC at √s = 8 TeV. The data correspond to an integrated luminosity of 19.7 fb⁻¹. The search considers HH resonances with masses between 1 and 3 TeV, having final states of two b quark pairs. Each Higgs boson is produced with large momentum, and the hadronization products of the pair of b quarks can usually be reconstructed as single large jets. The background from multijet and tt̄ events is significantly reduced by applying requirements related to the flavor of the jet, its mass, and its substructure. The signal would be identified as a peak on top of the dijet invariant mass spectrum of the remaining background events. No evidence is observed for such a signal. Upper limits obtained at 95% confidence level for the product of the production cross section and branching fraction σ(gg → X) B(X → HH → bb̄bb̄) range from 10 to 1.5 fb for the mass of X from 1.15 to 2.0 TeV, significantly extending previous searches. For a warped extra dimension theory with a mass scale Λ_R = 1 TeV, the data exclude radion scalar masses between 1.15 and 1.55 TeV.

    Measurement of tt̄ normalised multi-differential cross sections in pp collisions at √s = 13 TeV, and simultaneous determination of the strong coupling strength, top quark pole mass, and parton distribution functions

    Peer reviewed

    Measurement of the top quark mass using charged particles in pp collisions at √s = 8 TeV

    Peer reviewed

    Search for supersymmetry in events with one lepton and multiple jets in proton-proton collisions at √s = 13 TeV

    Peer reviewed

    Measurement of the top quark forward-backward production asymmetry and the anomalous chromoelectric and chromomagnetic moments in pp collisions at √s = 13 TeV

    The parton-level top quark (t) forward-backward asymmetry and the anomalous chromoelectric (d̂_t) and chromomagnetic (μ̂_t) moments have been measured using LHC pp collisions at a center-of-mass energy of 13 TeV, collected in the CMS detector in a data sample corresponding to an integrated luminosity of 35.9 fb⁻¹. The linearized variable A_FB^(1) is used to approximate the asymmetry. Candidate tt̄ events decaying to a muon or electron and jets in final states with low and high Lorentz boosts are selected and reconstructed using a fit of the kinematic distributions of the decay products to those expected for tt̄ final states. The values found for the parameters are A_FB^(1) = 0.048 +0.095/−0.087 (stat) +0.020/−0.029 (syst) and μ̂_t = −0.024 +0.013/−0.009 (stat) +0.016/−0.011 (syst), and a limit of |d̂_t| < 0.03 is placed at 95% confidence level.

    An embedding technique to determine ττ backgrounds in proton-proton collision data

    An embedding technique is presented to estimate standard model ττ backgrounds from data with minimal simulation input. In the data, the muons are removed from reconstructed μμ events and replaced with simulated τ leptons with the same kinematic properties. In this way, a set of hybrid events is obtained that does not rely on simulation except for the decay of the τ leptons. The challenges in describing the underlying event or the production of associated jets in the simulation are avoided. The technique described in this paper was developed for CMS. Its validation and the inherent uncertainties are also discussed. The demonstration of the performance of the technique is based on a sample of proton-proton collisions collected by CMS in 2017 at √s = 13 TeV, corresponding to an integrated luminosity of 41.5 fb⁻¹.

    Measurement of the Splitting Function in pp and Pb-Pb Collisions at √s_NN = 5.02 TeV

    Data from heavy ion collisions suggest that the evolution of a parton shower is modified by interactions with the color charges in the dense partonic medium created in these collisions, but it is not known where in the shower evolution the modifications occur. The momentum ratio of the two leading partons, resolved as subjets, provides information about the parton shower evolution. This substructure observable, known as the splitting function, reflects the process of a parton splitting into two other partons and has been measured for jets with transverse momentum between 140 and 500 GeV, in pp and PbPb collisions at a center-of-mass energy of 5.02 TeV per nucleon pair. In central PbPb collisions, the splitting function indicates a more unbalanced momentum ratio, compared to peripheral PbPb and pp collisions. The measurements are compared to various predictions from event generators and analytical calculations.
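
    For reference, the momentum sharing between the two leading subjets that this observable measures is commonly written as the fraction below; the grooming condition z_g > z_cut usually applied before the measurement is stated here only as the common convention, since the abstract does not spell out the grooming settings.

        % momentum sharing fraction of the two leading subjets
        z_g = \frac{\min(p_{T,1},\, p_{T,2})}{p_{T,1} + p_{T,2}}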